A Bayesian regularization-backpropagation neural network model for peeling computations

Authors

Abstract

A Bayesian regularization-backpropagation neural network (BR-BPNN) model is employed to predict some aspects of gecko spatula peeling, viz. the variation of the maximum normal and tangential pull-off forces and the resultant force angle at detachment with the peeling angle. K-fold cross validation is used to improve the effectiveness of the model. The input data are taken from finite element (FE) results. The model is trained with 75% of the FE dataset, and the remaining 25% are utilized to predict the peeling behavior. The training performance is evaluated for every change in the number of hidden-layer neurons to determine the optimal network structure. The relative error is calculated to draw a clear comparison between the predicted and FE results. It is shown that the BR-BPNN model in conjunction with the k-fold technique has significant potential to estimate the peeling behavior.
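The workflow described in the abstract (a 75%/25% split of the FE data, k-fold cross validation, a sweep over the number of hidden-layer neurons, and a relative-error comparison against the held-out data) can be sketched as below. This is only an illustrative sketch, not the authors' code: scikit-learn's MLPRegressor with L2 regularization is used here as a stand-in for Bayesian regularization-backpropagation training, and the FE peeling dataset is replaced by synthetic placeholder data.

```python
# Minimal sketch of the abstract's workflow under stated assumptions; not the paper's implementation.
import numpy as np
from sklearn.model_selection import KFold, train_test_split
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)

# Placeholder for the FE dataset: peeling angle as input, two outputs standing in
# for the normal and tangential pull-off forces (synthetic, for illustration only).
X = rng.uniform(10, 170, size=(200, 1))             # peeling angle in degrees
y = np.column_stack([np.sin(np.radians(X[:, 0])),   # stand-in "normal force"
                     np.cos(np.radians(X[:, 0]))])  # stand-in "tangential force"

# 75% of the data for training, the remaining 25% held out for prediction.
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25, random_state=0)

# Sweep the number of hidden-layer neurons and score each candidate with k-fold CV.
kf = KFold(n_splits=5, shuffle=True, random_state=0)
best_n, best_err = None, np.inf
for n_hidden in range(2, 21, 2):
    fold_errs = []
    for tr_idx, va_idx in kf.split(X_train):
        net = MLPRegressor(hidden_layer_sizes=(n_hidden,), alpha=1e-3,
                           max_iter=5000, random_state=0)
        net.fit(X_train[tr_idx], y_train[tr_idx])
        fold_errs.append(np.mean(np.abs(net.predict(X_train[va_idx]) - y_train[va_idx])))
    if np.mean(fold_errs) < best_err:
        best_n, best_err = n_hidden, np.mean(fold_errs)

# Retrain with the selected architecture and report the relative error on the 25% split.
net = MLPRegressor(hidden_layer_sizes=(best_n,), alpha=1e-3, max_iter=5000, random_state=0)
net.fit(X_train, y_train)
rel_err = np.abs(net.predict(X_test) - y_test) / (np.abs(y_test) + 1e-12)
print(f"selected hidden neurons: {best_n}, mean relative error: {rel_err.mean():.3f}")
```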


Similar articles

Neural Network Model of the Backpropagation Algorithm

We apply a neural network to model the neural network learning algorithm itself. The process of weight updating in a neural network is observed and stored in a file. Later, these data are used to train another network, which will then be able to train neural networks by imitating the trained algorithm. We use the backpropagation algorithm both for training and for sampling the training process. We i...

Full text

Genetic Algorithm Neural Network Model vs Backpropagation Neural Network Model for GDP Forecasting

This paper evaluates the usefulness of neural networks in GDP forecasting. It focuses on comparing a neural network model trained with a genetic algorithm (GANN) to a backpropagation neural network model, both used to forecast the GDP of Albania. GDP forecasting is of particular importance for decision-making in the field of economics. The conclusion is that the GANN model achieves higher ...

Full text

Backpropagation Neural Network Tutorial

The Architecture of BPNNs: A population P of objects that are similar but not identical allows P to be partitioned into a set of K groups, or classes, whereby the objects within the same class are more similar and the objects between classes are more dissimilar. The objects have N attributes (called properties or features) that can be measured (observed) so that each object can be represented b...

Full text

A Bayesian Neural Network Model with Extensions

This report deals with a Bayesian neural network in a classifier context. In our network model, the units represent stochastic events, and the states of the units are related to the probabilities of these events. The basic Bayesian model is a one-layer neural network, which calculates the posterior probabilities of events, given some observed, independent events. The formulas underlying this networ...

Full text

A Backpropagation Neural Network for Computer Network Security

In this paper, an efficient and scalable technique for computer network security is presented. On the one hand, the decryption scheme and the public key creation used in this work are based on a multi-layer neural network that is trained by the backpropagation learning algorithm. On the other hand, the encryption scheme and the private key creation process are based on Boolean algebra. This is a new po...

Full text


Journal

Journal title: Journal of Adhesion

Year: 2021

ISSN: 0021-8464, 1563-518X, 1026-5414

DOI: https://doi.org/10.1080/00218464.2021.2001335